Search for: All records

Creators/Authors contains: "Sharma, Prateek"

Note: Clicking a Digital Object Identifier (DOI) number takes you to an external site maintained by the publisher. Some full-text articles may not be available free of charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from this site's.

  1. Free, publicly-accessible full text available November 1, 2026
  2. Free, publicly-accessible full text available August 1, 2026
  3. Free, publicly-accessible full text available May 1, 2026
  4. We develop a comprehensive framework for storing, analyzing, forecasting, and visualizing industrial energy systems consisting of multiple devices and sensors. Our framework models complex energy systems as a dynamic knowledge graph, uses a novel machine learning (ML) model for energy forecasting, and visualizes continuous predictions through an interactive dashboard. At the core of this framework is A-RNN, a simple yet efficient model that uses dynamic attention mechanisms for automated feature selection. We validate the model on datasets from two manufacturers and one university testbed containing hundreds of sensors. Our results show that A-RNN forecasts energy usage within 5% of observed values, up to 50% more accurate than standard RNN models that rely on individual features and devices. Additionally, A-RNN identifies the key features that drive forecasting accuracy, providing interpretability for model forecasts. Our analytics platform is computationally and memory efficient, making it suitable for deployment on edge devices and in manufacturing plants. (A minimal sketch of the attention-over-features idea appears after this list.)
    Free, publicly-accessible full text available May 1, 2026
  5. Free, publicly-accessible full text available June 1, 2026
  6. Free, publicly-accessible full text available July 11, 2026
  7. The carbon footprint associated with large language models (LLMs) is a significant concern, encompassing emissions from their training, inference, experimentation, and storage, including both operational and embodied carbon. An essential task is accurately estimating the carbon impact of emerging LLMs before they are trained, which depends heavily on GPU usage. Existing studies have reported the carbon footprint of LLM training, but only one tool, mlco2, can predict the carbon footprint of new neural networks prior to physical training. However, mlco2 has several serious limitations: it cannot extend its estimates to dense or mixture-of-experts (MoE) LLMs, it disregards critical architectural parameters, it focuses solely on GPUs, and it cannot model embodied carbon footprints. Addressing these gaps, we introduce MLCarbon, an end-to-end carbon footprint projection model designed for both dense and MoE LLMs. Compared to mlco2, MLCarbon significantly improves the accuracy of carbon footprint estimates for various LLMs. The source code is released at https://github.com/SotaroKaneda/MLCarbon. (A rough illustrative version of such a projection appears after this list.)
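
To make the dynamic-attention idea from item 4 concrete, below is a minimal PyTorch sketch of an RNN whose inputs are reweighted at each step by a state-dependent attention gate over the features. This illustrates the general technique only; the class name, layer sizes, and gating scheme are assumptions, not the authors' A-RNN implementation.

    import torch
    import torch.nn as nn

    class AttentionRNN(nn.Module):
        # Hypothetical sketch: a GRU forecaster with a per-step attention
        # gate over input features (soft feature selection).
        def __init__(self, n_features: int, hidden_size: int = 64):
            super().__init__()
            # Scores each input feature from the previous hidden state.
            self.feature_attn = nn.Linear(hidden_size, n_features)
            self.rnn_cell = nn.GRUCell(n_features, hidden_size)
            self.head = nn.Linear(hidden_size, 1)  # next-step energy forecast

        def forward(self, x: torch.Tensor):
            # x: (batch, time, n_features) sensor readings
            batch, time, _ = x.shape
            h = x.new_zeros(batch, self.rnn_cell.hidden_size)
            attn = []
            for t in range(time):
                # Dynamic attention: weights depend on the current state,
                # so the "selected" features can change over time.
                w = torch.softmax(self.feature_attn(h), dim=-1)
                h = self.rnn_cell(x[:, t, :] * w, h)
                attn.append(w)
            # Averaging the weights over time gives a rough
            # feature-importance signal for interpretability.
            return self.head(h), torch.stack(attn, dim=1)

    model = AttentionRNN(n_features=8)
    forecast, weights = model(torch.randn(32, 24, 8))  # 24 steps, 8 sensors

Gating the inputs (rather than attending over hidden states) keeps the model small, which matters for the edge deployments the abstract describes.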
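
For item 7, the sketch below shows the kind of back-of-the-envelope projection such a tool performs: operational emissions from GPU energy draw plus embodied emissions amortized over hardware lifetime. Every constant here (power draw, PUE, grid intensity, embodied carbon, lifetime) is an illustrative assumption; MLCarbon's actual model is architecture-aware and considerably more detailed.

    def training_carbon_kg(gpu_count: int,
                           gpu_power_kw: float,
                           train_hours: float,
                           pue: float = 1.2,                 # datacenter overhead
                           grid_kgco2_per_kwh: float = 0.4,  # grid carbon intensity
                           embodied_kg_per_gpu: float = 150.0,
                           gpu_lifetime_hours: float = 5 * 365 * 24) -> float:
        """Operational + amortized embodied CO2e (kg) for one training run.
        All default values are illustrative assumptions."""
        energy_kwh = gpu_count * gpu_power_kw * train_hours * pue
        operational = energy_kwh * grid_kgco2_per_kwh
        # Embodied emissions charged in proportion to lifetime consumed.
        embodied = gpu_count * embodied_kg_per_gpu * (train_hours / gpu_lifetime_hours)
        return operational + embodied

    # e.g., 512 GPUs drawing 0.4 kW each for 30 days
    print(f"{training_carbon_kg(512, 0.4, 30 * 24):,.0f} kg CO2e")

Note how operational emissions scale with energy used while embodied emissions scale with hardware time consumed, which is why projections for new LLMs hinge on estimated GPU usage.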